feat: configurable touchscreen and touchpad gesture settings #3771
julianjc84 wants to merge 38 commits into
Conversation
self.gestures
    .as_ref()
    .and_then(|g| g.workspace_switch.as_ref())
    .map_or(self.natural_scroll, |a| a.natural_scroll || self.natural_scroll)

You're using ||, so will per-gesture overrides for this still work?
self.gestures
    .as_ref()
    .and_then(|g| g.view_scroll.as_ref())
    .map_or(self.natural_scroll, |a| a.natural_scroll || self.natural_scroll)

Same question as previous.
self.gestures
    .as_ref()
    .and_then(|g| g.overview_toggle.as_ref())
    .map_or(self.natural_scroll, |a| a.natural_scroll || self.natural_scroll)
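The reviewer's concern can be made concrete with a small sketch. This is an illustrative stand-in for the config types (names simplified, not niri's actual structs), assuming a per-gesture setting is meant to override the global flag:

```rust
// With `||`, a per-gesture `natural_scroll = false` can never win over a
// global `true`. Using the per-gesture value alone when it is present allows
// overriding in both directions.

#[derive(Clone, Copy)]
struct GestureOverride {
    natural_scroll: bool,
}

struct Touchscreen {
    natural_scroll: bool,                      // global setting
    workspace_switch: Option<GestureOverride>, // per-gesture override
}

impl Touchscreen {
    // Current form from the diff: OR-ing with the global flag masks a
    // `false` override.
    fn natural_scroll_or(&self) -> bool {
        self.workspace_switch
            .map_or(self.natural_scroll, |a| a.natural_scroll || self.natural_scroll)
    }

    // Override form: the per-gesture value, when set, replaces the global one.
    fn natural_scroll_override(&self) -> bool {
        self.workspace_switch
            .map_or(self.natural_scroll, |a| a.natural_scroll)
    }
}
```

With a global `true` and a per-gesture `false`, the two forms disagree: the `||` form stays `true`, the override form honors the `false`.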
impl ScrollFactor {
    pub fn h_v_factors(&self) -> (f64, f64) {
-       let base_value = self.base.map(|f| f.0).unwrap_or(1.0);
+       let base_value = self.base.map(|f| f.0).unwrap_or(0.4);

I assume you changed this for the touchscreen gestures, but don't the touch_pad_ gestures also use this? How will they be affected?
Yes, correct. Last night I separated out the touchpad handling that was hardcoded in.
So now touch (renamed touchscreen) and touchpad are separated and fully independent.
There is still the issue with natural-scroll, which is why I'm looking at a Lua config implementation.
Yes, correct. Last night I separated out touchpad that is hardcoded in. So now touch, renamed touchscreen and touchpad are separated and fully independent.
Got it.
The Lua config idea is overkill for most of the config surface exposed by niri, the natural-scroll case here included. My implementation, for all its seeming complexity, is only able to expose the config that KDL already handles well, plus the events/commands on the IPC, with some async handlers.
Is it possible to also do screen-edge gestures? Shouldn't be too complicated, I would think.
Latest push adds:
Actions: "view-scroll", "workspace-switch", "overview-toggle"
https://github.com/julianjc84/niri/tree/feat/touch-lua adds an optional Lua scripting layer for touchscreen gesture handling, plus
Why Lua? Wayland compositors are deliberately restrictive about input access. KDL is static — it maps finger counts to fixed actions but can't express
Lua callbacks return simple values (false = fall through, true = handled,
That's pretty cool. Is it possible to do something like have the gesture state be used for creating animated gestures that are not provided by the compositor? Though even for that, embedding Lua for this is simply not in line with the design of the main project. Maybe exposing that state, which you do through the Lua API/callbacks now, can be done via IPC instead. YaLTeR did have some intent and ideas for expanding the IPC surface. This needs a lot of discussion and serious design work.
Thanks for the feedback; took it on board. Dropped the Lua approach entirely and went with a KDL bind system instead (I don't know what I was thinking). The compositor infers continuous vs. discrete from the action: focus-workspace-up, focus-column-right, and toggle-overview drive animations that track your finger, while everything else (like close-window or screenshot) fires once at threshold. If unsure, just bind the action you want; the compositor knows which ones animate and which ones fire. Still interested in the IPC gesture-progress idea for driving external widget animations (QuickShell etc.). The bind system maps out what gesture state is useful to expose (finger count, direction, deltas, pinch spread, continuous progress), which could inform an IPC event-surface design down the line. Open to feedback on the approach.
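The continuous-vs-discrete inference can be sketched with a toy classifier. The action names mirror the ones mentioned in the thread; the function itself is an illustrative assumption, not niri's actual code:

```rust
// Hedged sketch of "the compositor infers continuous vs discrete from the
// action": animation-driving actions track the finger, everything else fires
// once at the recognition threshold.

#[derive(Debug, PartialEq)]
enum GestureKind {
    Continuous, // animation tracks the finger (workspace switch, overview)
    Discrete,   // fires once when the threshold is crossed
}

fn kind_for_action(action: &str) -> GestureKind {
    match action {
        // Actions that drive a compositor animation follow the finger.
        "focus-workspace-up" | "focus-workspace-down"
        | "focus-column-left" | "focus-column-right"
        | "toggle-overview" => GestureKind::Continuous,
        // Everything else (close-window, screenshot, spawn, ...) fires once.
        _ => GestureKind::Discrete,
    }
}
```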
This is much better! With the way that's implemented, you can likely move the entire set of touch-binds child nodes into
The idea, well mine at least, is that we have new events such as GestureBegin, GestureProgress and GestureEnd, with associated context data (such as maybe some tags?). Then external processes can listen on the event stream for those events, match the tags that we set in the compositor config, and use the event data to drive their custom animations. There's probably a better spin on this idea, but it seems like as good a starting point as any.
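The event shapes described above can be sketched as a plain enum. Field names here are assumptions for illustration, not the final IPC surface:

```rust
// Sketch of GestureBegin / GestureProgress / GestureEnd events carrying a tag
// that external consumers match against their own config.

#[derive(Debug)]
enum GestureEvent {
    Begin { tag: String, fingers: u8 },
    Progress { tag: String, progress: f64 }, // 0.0..=1.0 along the gesture
    End { tag: String, completed: bool },
}

// An external process filters the event stream down to the tags it set up
// in the compositor config.
fn matches_tag(event: &GestureEvent, wanted: &str) -> bool {
    let tag = match event {
        GestureEvent::Begin { tag, .. }
        | GestureEvent::Progress { tag, .. }
        | GestureEvent::End { tag, .. } => tag,
    };
    tag.as_str() == wanted
}
```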
That is next-level genius. Moving touch-binds into binds {} makes sense, and it unlocks modifier combos for free. The same approach should apply to touchpad gestures too, don't you think? That would give us one unified bind system: keyboard, touchscreen, and touchpad all sharing the same binds {} block, same action set, same modifier combos.
Perhaps it would be better to have a
Moved touch gestures into the main binds block:

binds {
// Touchscreen gestures
Touch3SwipeUp natural-scroll=true { focus-workspace-up; }
Touch4SwipeUp { toggle-overview; }
Touch3PinchIn { open-overview; }
TouchEdgeLeft sensitivity=0.6 natural-scroll=true { focus-column-right; }
// Modifier + touchscreen (hold Super + swipe)
Mod+Touch3SwipeUp { move-window-to-workspace-up; }
// Touchpad gestures (same system)
TouchpadSwipe3Up { focus-workspace-up; }
// Modifier + touchpad
Mod+TouchpadSwipe3Up { move-window-to-workspace-up; }
}

Detection thresholds stay in
Per-bind properties:
Full test config:

binds {
// Touchscreen gestures
Touch3SwipeUp natural-scroll=true { focus-workspace-up; }
Touch3SwipeDown natural-scroll=true { focus-workspace-down; }
Touch3SwipeLeft natural-scroll=true { focus-column-right; }
Touch3SwipeRight natural-scroll=true { focus-column-left; }
Touch4SwipeUp { toggle-overview; }
Touch4SwipeDown { toggle-overview; }
Touch5SwipeDown { close-window; }
Touch5SwipeUp { screenshot; }
Touch3PinchIn { open-overview; }
Touch3PinchOut { close-overview; }
Touch4PinchIn { open-overview; }
Touch4PinchOut { close-overview; }
Touch5PinchIn { close-window; }
Touch5PinchOut { screenshot; }
// Touchscreen Edge gestures
TouchEdgeLeft sensitivity=0.6 natural-scroll=true { focus-column-right; }
TouchEdgeRight sensitivity=0.6 natural-scroll=true { focus-column-left; }
TouchEdgeTop sensitivity=0.5 natural-scroll=true { focus-workspace-up; }
TouchEdgeBottom sensitivity=0.5 natural-scroll=true { focus-workspace-down; }
// Touchpad gestures
TouchpadSwipe3Up { focus-workspace-up; }
TouchpadSwipe3Down { focus-workspace-down; }
TouchpadSwipe3Left { focus-column-right; }
TouchpadSwipe3Right { focus-column-left; }
TouchpadSwipe4Up { toggle-overview; }
TouchpadSwipe4Down { toggle-overview; }
}

input {
touchscreen {
gestures {
recognition-threshold 26.0
edge-threshold 30.0
pinch-threshold 20.0
pinch-ratio 2.0
pinch-sensitivity 1.0
finger-threshold-scale 2.6
}
}
touchpad {
tap
dwt
natural-scroll
accel-speed 0.2
accel-profile "adaptive"
scroll-factor 0.9
gestures {
recognition-threshold 16.0
}
}
}

Tags are next...
Good point —
IPC Gesture Events — Implementation & Findings

We've implemented tagged gesture events over IPC, building on the
Three modes:

// Observe — niri runs the action, IPC mirrors live progress
Touch3SwipeUp tag="ws-nav" { focus-workspace-up; }
// IPC-only — gesture emits events, niri does nothing
Touch4SwipeUp tag="app-drawer" { noop; }
// Plain — no tag, no IPC events, just the action
Touch5SwipeDown { close-window; }

What the event stream looks like:
Pros:
Security consideration:
Cons / Open questions:
Atan-D's original idea of tags for matching gesture events maps well. The
niri-tag-sidebar: Gesture Tag Proof of Concept

A GTK4 app that creates slide-out drawer panels driven by niri's IPC gesture events. Swipe from a screen edge and the panel follows your finger. Repo: https://github.com/julianjc84/niri-tag-sidebar

Gesture binds with a
binds {
TouchEdgeLeft tag="sidebar-left" { noop; }
TouchEdgeRight tag="sidebar-right" { noop; }
}
The sidebar maps
Tags also work on touchpad gestures and compositor-animated actions:

binds {
TouchpadSwipe3Up tag="ws-nav" { focus-workspace-up; }
Touch3SwipeDown tag="ws-nav" { focus-workspace-down; }
}

Known Issues

Progress mismatch on compositor gestures — When a tagged gesture also drives a compositor animation (workspace switch, etc.), niri uses its own internal thresholds to decide when to commit. IPC progress is independent and may not match. For

Touchscreen vs touchpad scale — Touchscreen deltas are screen pixels (tracks close to niri's internals). Touchpad deltas are libinput acceleration-adjusted units (nonlinear, harder to tune). Both have configurable

input {
touchscreen { gestures { gesture-progress-distance 200.0 } } // screen pixels
touchpad { gestures { gesture-progress-distance 40.0 } } // libinput units
}

GestureEnd

Configs

touchscreen-gestures.kdl:

input {
touchscreen {
gestures {
// Detection tuning
recognition-threshold 26.0
edge-threshold 30.0
pinch-threshold 20.0
pinch-ratio 2.0
pinch-sensitivity 1.0
finger-threshold-scale 2.6
gesture-progress-distance 200.0 // screen pixels for IPC progress 0→1
}
}
}
binds {
// Touchscreen gestures
Touch3SwipeUp natural-scroll=true tag="ws-nav" { focus-workspace-up; }
Touch3SwipeDown natural-scroll=true tag="ws-nav" { focus-workspace-down; }
Touch3SwipeLeft natural-scroll=true tag="col-nav" { focus-column-right; }
Touch3SwipeRight natural-scroll=true tag="col-nav" { focus-column-left; }
Touch4SwipeUp { toggle-overview; }
Touch4SwipeDown { toggle-overview; }
Touch5SwipeDown tag="test-discrete" { close-window; }
Touch5SwipeUp tag="test-discrete" { screenshot; }
Touch3PinchIn { open-overview; }
Touch3PinchOut { close-overview; }
Touch4PinchIn { open-overview; }
Touch4PinchOut { close-overview; }
Touch5PinchIn { close-window; }
Touch5PinchOut { screenshot; }
// TOUCHSCREEN EDGE
// TouchEdgeLeft sensitivity=0.6 natural-scroll=true { focus-column-right; }
// TouchEdgeRight sensitivity=0.6 natural-scroll=true { focus-column-left; }
// TouchEdgeTop sensitivity=0.5 natural-scroll=true { focus-workspace-up; }
// TouchEdgeBottom sensitivity=0.5 natural-scroll=true { focus-workspace-down; }
// Edge swipes for niri-tag-sidebar (IPC-only, no compositor action)
TouchEdgeLeft tag="sidebar-left" { noop; }
TouchEdgeRight tag="sidebar-right" { noop; }
TouchEdgeTop tag="sidebar-top" { noop; }
TouchEdgeBottom tag="sidebar-bottom" { noop; }
// Mod+Touchscreen gestures (test — hold Super + touch gesture)
Mod+Touch3SwipeUp { spawn "notify-send" "Mod+Touch" "3-finger Swipe Up"; }
Mod+Touch3SwipeDown { spawn "notify-send" "Mod+Touch" "3-finger Swipe Down"; }
Mod+Touch3SwipeLeft { spawn "notify-send" "Mod+Touch" "3-finger Swipe Left"; }
Mod+Touch3SwipeRight { spawn "notify-send" "Mod+Touch" "3-finger Swipe Right"; }
Mod+Touch3PinchIn { spawn "notify-send" "Mod+Touch" "3-finger Pinch In"; }
Mod+Touch3PinchOut { spawn "notify-send" "Mod+Touch" "3-finger Pinch Out"; }
Mod+Touch4SwipeUp { spawn "notify-send" "Mod+Touch" "4-finger Swipe Up"; }
Mod+Touch4SwipeDown { spawn "notify-send" "Mod+Touch" "4-finger Swipe Down"; }
Mod+Touch4SwipeLeft { spawn "notify-send" "Mod+Touch" "4-finger Swipe Left"; }
Mod+Touch4SwipeRight { spawn "notify-send" "Mod+Touch" "4-finger Swipe Right"; }
Mod+Touch4PinchIn { spawn "notify-send" "Mod+Touch" "4-finger Pinch In"; }
Mod+Touch4PinchOut { spawn "notify-send" "Mod+Touch" "4-finger Pinch Out"; }
Mod+Touch5SwipeUp { spawn "notify-send" "Mod+Touch" "5-finger Swipe Up"; }
Mod+Touch5SwipeDown { spawn "notify-send" "Mod+Touch" "5-finger Swipe Down"; }
Mod+Touch5SwipeLeft { spawn "notify-send" "Mod+Touch" "5-finger Swipe Left"; }
Mod+Touch5SwipeRight { spawn "notify-send" "Mod+Touch" "5-finger Swipe Right"; }
Mod+Touch5PinchIn { spawn "notify-send" "Mod+Touch" "5-finger Pinch In"; }
Mod+Touch5PinchOut { spawn "notify-send" "Mod+Touch" "5-finger Pinch Out"; }
Mod+TouchEdgeLeft { spawn "notify-send" "Mod+Edge" "Left"; }
Mod+TouchEdgeRight { spawn "notify-send" "Mod+Edge" "Right"; }
Mod+TouchEdgeTop { spawn "notify-send" "Mod+Edge" "Top"; }
Mod+TouchEdgeBottom { spawn "notify-send" "Mod+Edge" "Bottom"; }
}

touchpad-gestures.kdl:

input {
touchpad {
tap
dwt
natural-scroll
accel-speed 0.2
accel-profile "adaptive"
scroll-factor 0.9
gestures {
recognition-threshold 16.0
gesture-progress-distance 250.0 // libinput delta units for IPC progress 0→1
}
}
}
binds {
// Touchpad gestures (plain — no modifier)
TouchpadSwipe3Up tag="ws-nav" { focus-workspace-up; }
TouchpadSwipe3Down tag="ws-nav" { focus-workspace-down; }
TouchpadSwipe3Left tag="col-nav" { focus-column-right; }
TouchpadSwipe3Right tag="col-nav" { focus-column-left; }
TouchpadSwipe4Up { toggle-overview; }
TouchpadSwipe4Down { toggle-overview; }
// Mod+Touchpad gestures (test — hold Super + swipe triggers notification)
Mod+TouchpadScrollUp { spawn "notify-send" "Scroll" "2-finger Up"; }
Mod+TouchpadScrollDown { spawn "notify-send" "Scroll" "2-finger Down"; }
Mod+TouchpadScrollLeft { spawn "notify-send" "Scroll" "2-finger Left"; }
Mod+TouchpadScrollRight { spawn "notify-send" "Scroll" "2-finger Right"; }
Mod+TouchpadSwipe3Up { spawn "notify-send" "Swipe" "3-finger Up"; }
Mod+TouchpadSwipe3Down { spawn "notify-send" "Swipe" "3-finger Down"; }
Mod+TouchpadSwipe3Left { spawn "notify-send" "Swipe" "3-finger Left"; }
Mod+TouchpadSwipe3Right { spawn "notify-send" "Swipe" "3-finger Right"; }
Mod+TouchpadSwipe4Up { spawn "notify-send" "Swipe" "4-finger Up"; }
Mod+TouchpadSwipe4Down { spawn "notify-send" "Swipe" "4-finger Down"; }
Mod+TouchpadSwipe4Left { spawn "notify-send" "Swipe" "4-finger Left"; }
Mod+TouchpadSwipe4Right { spawn "notify-send" "Swipe" "4-finger Right"; }
Mod+TouchpadSwipe5Up { spawn "notify-send" "Swipe" "5-finger Up"; }
Mod+TouchpadSwipe5Down { spawn "notify-send" "Swipe" "5-finger Down"; }
Mod+TouchpadSwipe5Left { spawn "notify-send" "Swipe" "5-finger Left"; }
Mod+TouchpadSwipe5Right { spawn "notify-send" "Swipe" "5-finger Right"; }
}

niri-tag-sidebar config (sample-config.toml):

# niri-tag-sidebar configuration
#
# Each [[panel]] defines a slide-out drawer bound to a gesture tag.
# The tag must match a `tag="..."` property on a gesture bind in your niri config.
#
# layer options: "overlay" (above waybar), "top" (same as waybar), "bottom", "background"
[[panel]]
tag = "sidebar-left"
edge = "left"
size = 400
snap_threshold = 0.4
bg_color = "rgba(40, 100, 220, 0.95)"
label = "Navigation"
layer = "overlay"
[[panel]]
tag = "sidebar-right"
edge = "right"
size = 400
snap_threshold = 0.5
bg_color = "rgba(220, 50, 50, 0.95)"
label = "Quick Settings"
layer = "overlay"
[[panel]]
tag = "sidebar-top"
edge = "top"
size = 250
snap_threshold = 0.3
bg_color = "rgba(50, 180, 80, 0.95)"
label = "Notifications"
layer = "overlay"
[[panel]]
tag = "sidebar-bottom"
edge = "bottom"
size = 300
snap_threshold = 0.6
bg_color = "rgba(220, 170, 30, 0.95)"
label = "Media Controls"
layer = "overlay"
# Progress bar — shows real-time gesture progress for workspace navigation
# Appears during 3-finger swipe (ws-nav tag) and auto-hides on release
[[panel]]
tag = "ws-nav"
edge = "bottom"
style = "bar"
bar_height = 40
bg_color = "rgba(100, 60, 200, 0.90)"
label = "Workspace"
layer = "overlay"
# Progress bar — shows real-time gesture progress for column navigation
# Appears during 3-finger horizontal swipe (col-nav tag) and auto-hides on release
[[panel]]
tag = "col-nav"
edge = "bottom"
style = "bar"
bar_height = 40
bg_color = "rgba(200, 120, 40, 0.90)"
label = "Column"
layer = "overlay"

What Tags Enable

Slide-out panels, gesture-driven launchers, custom workspace indicators, scrubbing controls (volume/brightness), accessibility tools — anything that wants live gesture progress without modifying the compositor.
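The panel behavior implied by the sample config can be sketched in a few lines. This is a hedged illustration of how a sidebar app might use gesture progress and `snap_threshold`; the names follow the TOML above, not the app's actual code:

```rust
// The drawer follows the finger: progress 0.0 = fully hidden, 1.0 = fully
// out, scaled by the panel's configured size in pixels.
fn panel_offset(progress: f64, size: u32) -> f64 {
    progress.clamp(0.0, 1.0) * size as f64
}

// On GestureEnd, the panel snaps open if the final progress cleared the
// configured snap_threshold, otherwise it slides back closed.
fn snaps_open(final_progress: f64, snap_threshold: f64) -> bool {
    final_progress >= snap_threshold
}
```

For example, with `size = 400` and `snap_threshold = 0.4`, releasing the finger at half-open (progress 0.5) would leave the panel open, while releasing at 0.3 would close it.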
Would you be able to provide a demo video of the app in motion?

Yes, absolutely, later today I will.

That is awesome!
Your ideas and guidance with staying on KDL + tags were instrumental in getting to this working proof-of-concept stage, so thank you.
Well, I'm still not fully happy with the design, because we have a sort of separation between where niri configures the setup for those binds and where that setup is used to implement the action itself by the external app. I was also thinking of something like having the spawn action behave differently when called with these binds: attaching state to the process being spawned via environment variables, which the spawned process can then read and use to drive animations. This way the state stays fully isolated, does not need to be exposed over IPC, and removes the need for tags/tagged events. Though I'm unsure how non-trivial it is, modifying the spawn action in that way. This is already pretty awesome, but I think we can still do better.
I updated the app (repo: https://github.com/julianjc84/niri-tag-sidebar) to accept touch input, so panels can now be closed by touch for a better UI experience.
Thank you so much for the video! Our thoughts (as someone daily-driving a tablet on niri) on this PR:
Everything else seems to be working fine for me. I did not test tags, as I am still unsure how to utilize those, but otherwise lovely work! If you ever want our feedback or testing, please feel free to ping.
Write-up of the design choices behind this branch's touchscreen gesture
stack, framed explicitly as a working prototype shaped by reviewer
feedback — not as niri's canonical design. Intended as a concrete
reference point for discussion, not a prescription.
Covers:
- What Wayland provides (wl_touch raw, wp_pointer_gestures_v1 for
touchpad) and the touchscreen gap
- Why touchscreen gesture recognition doesn't live in libinput
(direct manipulation, ambiguous recipient)
- How iOS UIKit, Android's systemGestureExclusionRects, Linux phone
shells, and userspace daemons (TouchEgg/lisgd/InputActions) solve
or fail to solve the same problem
- The specific choices this branch makes with labeled rationale:
compositor-side recognizer, unified binds {} block, tag+IPC events,
continuous noop, touchscreen-gesture-passthrough window rule,
Mod+edge escape hatches, per-edge zoned triggers
- Alternatives considered and rejected (dynamic client dialog,
allow-forwarding per bind, zone granularity, auto-detection,
external daemon, global disable toggle) with reasoning
- Future directions: a minimal Wayland protocol modeled on Android's
exclusion rects, unifying IPC progress with internal commit
threshold, touchpad passthrough sibling rule, accurate completed
flag on GestureEnd
- Open questions explicitly inviting reviewer pushback
The doc's framing block makes clear this is one possible approach, not
the approach, and that the whole point of writing it down is to make
the rationale arguable rather than buried in code.
Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
Add TouchRotate3/4/5 Cw/Ccw triggers alongside the existing swipe and
pinch families (same hardcoded-per-finger-count shape — true arbitrary
N-finger parameterization is still open in TODO §6). Rotation is
detected from the averaged per-finger angle change around the cluster
centroid; cross-finger averaging gives a √N noise-floor improvement
over single-finger angular drift. Rotation classification runs before
pinch and swipe so a clearly twisting cluster wins over incidental
spread or translation.
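The rotation measure described above (per-finger angle change around the cluster centroid, averaged across fingers) can be sketched as follows. This is a simplified model for illustration, not the branch's actual recognizer:

```rust
fn centroid(points: &[(f64, f64)]) -> (f64, f64) {
    let n = points.len() as f64;
    let (sx, sy) = points
        .iter()
        .fold((0.0, 0.0), |(sx, sy), p| (sx + p.0, sy + p.1));
    (sx / n, sy / n)
}

// Average signed angle change (radians) of each finger around the centroid.
// Averaging across fingers is what gives the √N noise-floor improvement over
// tracking a single finger's angular drift.
fn rotation_delta(prev: &[(f64, f64)], cur: &[(f64, f64)]) -> f64 {
    let c0 = centroid(prev);
    let c1 = centroid(cur);
    let sum: f64 = prev
        .iter()
        .zip(cur)
        .map(|(p, q)| {
            let a0 = (p.1 - c0.1).atan2(p.0 - c0.0);
            let a1 = (q.1 - c1.1).atan2(q.0 - c1.0);
            // Wrap into (-π, π] so a small twist across the ±π seam is not
            // read as a near-full turn.
            let mut d = a1 - a0;
            if d > std::f64::consts::PI { d -= 2.0 * std::f64::consts::PI; }
            if d <= -std::f64::consts::PI { d += 2.0 * std::f64::consts::PI; }
            d
        })
        .sum();
    sum / prev.len() as f64
}
```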
Tuning lives under input { touchscreen { gestures { } } }:
rotation-threshold (min radians to latch), rotation-ratio (how much
rotation arc length must dominate swipe/spread by), and
rotation-progress-distance (radians that map to IPC progress ±1.0).
While here, refactor GestureProgress's overloaded delta_x / delta_y
fields into a typed GestureDelta enum (Swipe { dx, dy } / Pinch
{ d_spread } / Rotate { d_radians }) so each gesture family can carry
its own natural shape instead of stuffing rotation radians into an
empty y slot.
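The typed delta shape described above, sketched as a plain enum with field names taken from the commit message, plus a hypothetical helper showing how each family carries its own natural magnitude:

```rust
// Each gesture family carries its own delta shape instead of overloading
// delta_x / delta_y.
#[derive(Debug, PartialEq)]
enum GestureDelta {
    Swipe { dx: f64, dy: f64 },
    Pinch { d_spread: f64 },
    Rotate { d_radians: f64 },
}

// Illustrative consumer: a single magnitude per family, with no ambiguity
// about what an "empty y slot" means.
fn magnitude(d: &GestureDelta) -> f64 {
    match d {
        GestureDelta::Swipe { dx, dy } => (dx * dx + dy * dy).sqrt(),
        GestureDelta::Pinch { d_spread } => d_spread.abs(),
        GestureDelta::Rotate { d_radians } => d_radians.abs(),
    }
}
```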
NOTE: rotation detection is an early PoC and is currently buggy and
intermittent on real hardware — recognition can misfire, lock at the
wrong finger count, or fail to latch. The math, IPC, unit tests, and
bind plumbing are all in place, but real-world tuning and edge cases
(lock grace period, finger-lift rebasing under rotation, noise
thresholds) still need debugging. Parking it here so the rest of the
work lands and the rotation pass can resume later.
Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
Collapse the 40 hardcoded Touch*/Touchpad* enum variants into 5
parameterized struct variants driven by KDL properties, so the
compositor no longer hardcodes the legal finger counts into the trigger
name. User configs go from:
TouchSwipe3Up { focus-workspace-up; }
TouchEdgeTop:Left { spawn "..."; }
TouchpadSwipe4Left { ... }
to the property form:
TouchSwipe fingers=3 direction="up" { focus-workspace-up; }
TouchEdge edge="top" zone="left" { spawn "..."; }
TouchpadSwipe fingers=4 direction="left" { ... }
Each family keeps its own knuffel-validated property set, so invalid
combinations fail at parse time:
- TouchSwipe / TouchpadSwipe: fingers=N (3..=10), direction="up|down|left|right"
- TouchPinch: fingers=N (3..=10), direction="in|out"
- TouchRotate: fingers=N (3..=10), direction="cw|ccw"
- TouchEdge: edge="left|right|top|bottom", optional zone=
(zone vocab rotates per edge axis: top/bottom edges take
left|center|right; left/right edges take top|center|bottom).
Why:
- Arbitrary finger counts (up to 10) are now supported with zero enum
churn. Touchscreens and large trackpads that report 6+ contacts are
no longer capped at 5.
- Syntax matches the rest of niri's KDL config, which already uses
property-based attributes everywhere (`tag=`, `natural-scroll=`,
`cooldown-ms=`, `allow-when-locked=`). Touch triggers were the one
family that hardcoded structural parameters into the node name.
- The old "can't distinguish finger counts at bind lookup" objection
(from an earlier half-design where the count lived on `Bind` instead
of inside `Trigger`) is moot: putting `fingers` inside the `Trigger`
struct variant keeps `Eq`/`Hash`/bind lookup unchanged.
Hard break — no legacy compat:
- Old `TouchSwipe3Up` / `TouchEdgeTop:Left` / `TouchpadSwipe4Left` etc.
no longer parse. Users get a clear parse error pointing at the new
property form.
- IPC `GestureBegin.trigger` string wire format also changes: events
now echo the same property shape that users write in binds, e.g.
`TouchSwipe fingers=3 direction="up"`. Consumers doing string-match
on trigger names must adapt.
Implementation:
- `niri-config/src/binds.rs`: Trigger enum refactor with 5 struct
variants, new `SwipeDirection` / `PinchDirection` / `RotateDirection`
helper enums, shared `MIN_FINGERS` / `MAX_FINGERS` constants (3..=10),
private `GestureTriggerProps` + `build_gesture_trigger()` builder
with per-family validation, `Bind::decode_node` routes gesture-family
node names to property-based parsing while keyboard / mouse / wheel /
TouchpadScroll keep the FromStr fast path.
- `niri-config/src/input.rs`: `ScreenEdge::as_kdl_name()` method and
module-level `zone_kdl_name(edge, zone)` helper as the single source
of truth for the axis-rotating zone vocabulary. Parser round-trips
through `zone_kdl_name` instead of hand-rolling the inverse mapping.
- `src/input/touch_gesture.rs`: recognition constructs the new struct
variants directly from (gesture kind, finger count); `MIN_FINGERS` /
`MAX_FINGERS` imported from niri-config so there's one range source.
`trigger_to_ipc_name` takes `Trigger` directly (was `Option<Trigger>`),
generates the property-form string dynamically, uses `zone_kdl_name`
and `ScreenEdge::as_kdl_name`, and `debug_assert!`s loudly if it's
ever called with a non-gesture trigger. Trivial `edge_to_trigger` /
`edge_zone_to_trigger` wrappers inlined at their two call sites.
- `src/input/mod.rs`: `swipe_trigger()` accepts any N in
`MIN_FINGERS..=MAX_FINGERS`, not just 3/4/5. The duplicate
`touchpad_trigger_ipc_name` is deleted; its caller routes through
`touch_gesture::trigger_to_ipc_name` (which already handled
`TouchpadSwipe`).
- `src/ui/hotkey_overlay.rs`: display labels generated from fingers +
direction via helper fns, with `format_touch_edge_label` routing
through the shared `zone_kdl_name` / `as_kdl_name` helpers.
- `niri-ipc/src/lib.rs`: `GestureBegin.trigger` docstring updated with
examples of every new family string form.
Note on duplicate properties: knuffel 3.2 stores `SpannedNode::properties`
as `BTreeMap<SpannedName, Value>` and silently takes the last value on
name collision (see knuffel-3.2.0/src/grammar.rs:566). So
`TouchSwipe fingers=3 fingers=5 direction="up"` resolves to `fingers=5`
with no error. This is knuffel-spec behavior, not something niri can
intercept without a separate pre-parse pass over raw KDL source. The
observed last-wins behavior is documented in a regression test.
Tests:
- 15 unit tests in `niri-config/src/binds.rs` exercise
`build_gesture_trigger` directly: arbitrary fingers=3..=10, per-family
direction validation, edge zone vocab rejection, fingers out of range.
- 16 integration tests go through `Config::parse_mem`, covering the
full Bind::decode_node two-phase parse path: modifier stripping
(`Mod+TouchSwipe ...`, multi-modifier `Mod+Shift+TouchEdge ...`),
tag on gesture vs keyboard-bind-rejection, gesture property on
keyboard bind rejection, unknown property rejection, TouchEdge with
zone and vocab mismatch rejection, TouchpadSwipe with modifier,
TouchRotate, TouchPinch direction="in" and "out", fingers-out-of-range,
cross-family direction rejection (swipe rejecting "cw"), and the
knuffel duplicate-property last-wins behavior.
- Full niri test suite: 198 pass.
Docs:
- `docs/wiki/Gestures.md`: rewritten to document the four node families
with property grammar and examples, including a rotation section
that warns about the early-PoC state of rotation recognition.
- `docs/wiki/Configuration:-Input.md`: touchscreen / touchpad tuning
subblocks updated with the new bind form.
- `docs/wiki/Design:-Touchscreen-Gestures.md` §5.2: rewritten. The
prior rationale explicitly argued *against* `fingers=N` and for
hardcoded node names; that position is now reversed, with the new
section explaining what changed and why.
- `resources/default-config.kdl`: user-facing commented examples
migrated to the new syntax so first-install users don't copy broken
snippets.
Pre-existing unrelated issue not touched by this commit: there is a
stale `tests::parse` insta snapshot in `niri-config/src/lib.rs` from
earlier commits that added tag / sensitivity / natural-scroll fields to
`Bind` without refreshing the snapshot. Needs a separate
`cargo insta review` pass.
Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
Debug-log driven investigation of 5-finger rotation reliability (details
in ~/aaaLOG_Niri analyzer output) surfaced three independent bugs that
together kept drift rotations at 0% and clean rotations at ~80%.
Bug 1 - finger-up spread basis discontinuity
src/input/touch_gesture.rs on_touch_up
The rotation basis was rebased on finger-lift but the spread basis
was not. When a user lifted a finger during unlocked recognition,
current_spread jumped (geometry change, not finger motion) and
spread_change = (current - initial).abs() spiked past pinch_threshold
on the very next frame, latching a spurious PinchIn. Observed as
finger-lift retries of 5-finger rotations turning into unwanted
PinchIn at fingers=4.
Fix: during unlocked recognition with 3+ fingers still down, rebase
initial_spread to the post-removal geometry so spread_change resets
to zero across the discontinuity. The locked-pinch rebase already
handles the other case.
Bug 2 - swipe LOCK has no dominance check
src/input/touch_gesture.rs FRAME classification
is_rotate required rotation_arc to dominate swipe_distance by the
rotation_ratio, but the swipe LOCK branch was a naked threshold
crossing (swipe_distance >= threshold). Result: any hand drift over
the swipe threshold (~110 px for 5 fingers) instantly latched as a
swipe, even when the user was visibly rotating with rot > 40 deg and
arc > 250 px. The thresholds raced and swipe always won because its
check was strictly weaker.
Fix: gate the swipe LOCK branch behind !rotation_candidate, where
rotation_candidate = (finger_count >= 3 && rotation_arc >=
rotation_arc_threshold). Once rotation arc is at candidate quality,
swipe cannot steal via the race - it only latches when rotation is
not a plausible classification.
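The bug-2 fix as described can be condensed into a sketch of the two gates. Thresholds and names are illustrative, following the description above rather than the actual classifier code:

```rust
// Rotation is a plausible classification once the averaged arc clears the
// candidate threshold with 3+ fingers down.
fn rotation_candidate(finger_count: u8, rotation_arc: f64, arc_threshold: f64) -> bool {
    finger_count >= 3 && rotation_arc >= arc_threshold
}

// Before the fix, swipe locked on a naked threshold crossing and always won
// the race. After the fix, swipe only latches when rotation is not a
// plausible classification.
fn swipe_locks(
    swipe_distance: f64,
    swipe_threshold: f64,
    finger_count: u8,
    rotation_arc: f64,
    arc_threshold: f64,
) -> bool {
    swipe_distance >= swipe_threshold
        && !rotation_candidate(finger_count, rotation_arc, arc_threshold)
}
```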
Bug 3 - rotation_ratio default inverted vs intent
niri-config/src/input.rs rotation_ratio
Default was 0.5, and the is_rotate check does rotation_arc >=
swipe_distance * (1.0 / rotation_ratio), i.e. arc >= 2 * swipe. For
a user rotating at 5 fingers with natural hand drift, this gate is
essentially unreachable - arc is bounded by finger reach, swipe is
not. Meanwhile pinch_ratio=2.0 is used directly without inversion,
so the two ratios have opposite semantics for the same field name
(higher=stricter for pinch, higher=lenient for rotate).
Fix: bump default from 0.5 to 2.0, giving an effective gate of
arc >= swipe * 0.5 (rotation arc only needs to be half the swipe
distance). Doc comment rewritten to reflect the leniency semantics.
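The inverted-default problem reduces to one expression. A sketch of the gate as described, showing how the 0.5 → 2.0 default flip changes the effective requirement:

```rust
// The rotate gate divides by the ratio (unlike pinch_ratio, which is used
// directly), so higher values are MORE lenient here.
fn is_rotate_gate(rotation_arc: f64, swipe_distance: f64, rotation_ratio: f64) -> bool {
    rotation_arc >= swipe_distance * (1.0 / rotation_ratio)
}
```

With the old default of 0.5 the gate demands arc >= 2 × swipe, essentially unreachable under natural hand drift; with 2.0 it demands arc >= 0.5 × swipe.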
Config UX - angle values now in degrees
niri-config/src/input.rs rotation_threshold, rotation_progress_distance
rotation-threshold was a raw radian value (default 0.2618) and
rotation-progress-distance was another (default 1.5708 = pi/2).
Users reading the KDL had to convert degrees to radians in their
head. Switched both to degrees in the config surface - accessors
convert to radians internally, callers see no change. Defaults are
now 15.0 and 90.0. Docs updated.
Telemetry bars in debug logs
No new code. The existing TOUCH-DBG FRAME debug_log site at line
~819 already emits per-frame swipe/spread/rot/arc/is_rotate/closest
values which is what the investigation was based on - leaving it as
is so future debug cycles can reuse the journalctl -> analyze.py
workflow.
Results on the four-scenario benchmark (5 finger clean/drift/retry):
baseline: 47% (9/19) - 5 stolen by swipe, 2 false PinchIn, 1 swipe-steal
t3 (bug1+bug2): 57% - drift 0% -> 56%, retry correctly no-motion
t4 (+ratio fix): 75% - all remaining failures are user under-rotation
(rot < 15 deg threshold) or extreme drift
(arc/swipe < 0.5)
Adds Event::RecognitionFrame (debug-only IPC, gated on debug_assertions)
so external tools like niri-gesture-inspector can visualize the
touchscreen gesture classifier state per frame. Also lands an overhaul
of the classifier itself.
Knob rename for clarity:
- All touchscreen + touchpad gesture knobs renamed to a unified
vocabulary: `*-trigger-distance` (commit gates), `*-progress-distance`
(IPC scaling), `*-dominance-ratio` (race tuning, "higher = stricter"
for both pinch and rotation), `*-multi-finger-scale` (per-finger
multipliers). Same knob names span both touchscreen and touchpad
blocks; defaults differ because units differ (px vs libinput delta).
- IPC RecognitionFrame + GestureBegin field names match the new
knob names.
- Doc + default-config.kdl examples updated to match.
Pinch commit-gate bugs fixed:
- The commit gate `(is_pinch && spread_change >= swipe_trigger)` was
double-gating against the SCALED swipe trigger, so 4/5-finger pinches
inherited a wildly inflated commit threshold (e.g. 234/378 px at
swipe-multi-finger-scale 2.6) even though pinch-trigger-distance
stayed flat. Pinch now commits on `is_pinch` alone — pinch-trigger
is the single threshold for the pinch path.
- The pinch dominance check had a hardcoded ratio-1.0 fallback ORed
into the primary gate, capping the effective dominance ratio at
min(knob, 1.0). Any setting > 1.0 was silently masked. Removed the
fallback so pinch-dominance-ratio is now a real knob across its full
range (lower = lenient, higher = stricter).
Default tuning (touchscreen):
- swipe-trigger-distance: 16 → 100
- pinch-trigger-distance: 30 → 100
- pinch-dominance-ratio: 2.0 → 1.0
- swipe-multi-finger-scale: 1.5 → 1.2 (small pinch-priority bias at
4/5-finger so ambiguous motions
resolve as pinch over swipe)
- edge-start-distance: 20 → 30
- rotation-trigger-angle: 15° → 20°
GestureBegin/End emit unconditionally for multi-finger commits:
Multi-finger touchscreen gesture commits now emit GestureBegin/GestureEnd
unconditionally, with an empty `tag` for binds that have no user-set
tag. Previously these events were gated behind `tag.is_some()`, so
debug tools (niri-gesture-inspector) only saw the lock for binds the
user happened to have tagged — making "last lock" stick on the most
recent tagged swipe forever, even when the user was actually
pinching/rotating untagged binds. External consumers filter on tags
they care about, so empty-tag events are a no-op for them. Edge swipes
and touchpad gestures still emit only for tagged binds.
Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
Add TouchTap { fingers } trigger for binding stationary N-finger taps
(3-10 fingers). Tap detection runs in parallel with swipe/pinch/rotate
recognition using a spatial dead zone — matching the approach used by
Android, iOS, libinput, and Windows.
Tap candidate starts when 3+ fingers land, is killed if any finger
drifts beyond tap-wobble-threshold (default 15px) or when the
recognizer locks a motion gesture, and fires when all fingers lift
within tap-timeout-ms (default 250ms). No latency added to swipe
recognition.
Config: `tap-wobble-threshold` and `tap-timeout-ms` in the
`touchscreen { gestures { } }` block.
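A minimal sketch of the tap-candidate logic described above (the type and method names are assumptions, not niri's actual API): a spatial dead zone per finger plus a lift timeout, running alongside motion recognition:

```rust
use std::time::{Duration, Instant};

const TAP_WOBBLE_THRESHOLD: f64 = 15.0; // px, config default
const TAP_TIMEOUT: Duration = Duration::from_millis(250); // config default

/// Candidate starts when 3+ fingers land; killed by drift or a motion
/// gesture lock; fires when all fingers lift within the timeout.
struct TapCandidate {
    started: Instant,
    downs: Vec<(f64, f64)>, // per-finger landing positions
    alive: bool,
}

impl TapCandidate {
    fn new(downs: Vec<(f64, f64)>) -> Self {
        Self { started: Instant::now(), downs, alive: true }
    }

    /// Kill the candidate if any finger drifts out of its dead zone.
    fn on_motion(&mut self, finger: usize, x: f64, y: f64) {
        let (dx, dy) = (x - self.downs[finger].0, y - self.downs[finger].1);
        if (dx * dx + dy * dy).sqrt() > TAP_WOBBLE_THRESHOLD {
            self.alive = false;
        }
    }

    /// Called when the recognizer locks a swipe/pinch/rotate.
    fn kill(&mut self) {
        self.alive = false;
    }

    /// True if the bind should fire when the last finger lifts.
    fn fires_on_all_up(&self) -> bool {
        self.alive && self.started.elapsed() <= TAP_TIMEOUT
    }
}
```

Because the candidate only observes events the swipe/pinch/rotate path already receives, it adds no latency to motion recognition.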
Also fixes a pre-existing bug where `touchscreen_gesture_passthrough`
was not cleared in `on_touch_cancel`, which could permanently suppress
gesture recognition after a cancel event on a passthrough window.
Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
Defer gesture recognition and continuous animation feeding from per-slot
on_touch_motion to on_touch_frame.

libinput delivers separate TouchMotion events per finger per hardware
scan — previously each one triggered spread/rotation recalculation,
animation feeding, and queue_redraw_all(). With 3 fingers at 120Hz this
caused 360 gesture updates/sec instead of 120.

Now on_touch_motion only updates positions and accumulates deltas;
on_touch_frame runs recognition + feeding once per scan with all
positions final — more efficient and more accurate (spread/rotation
computed on complete geometry instead of partially-updated state).

Also adds TouchTap documentation to the wiki Gestures page.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
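The batching described above can be sketched as follows (field and method names are assumptions, not niri's actual types): motion handling stays cheap, and the expensive per-frame work runs once with all slots final:

```rust
use std::collections::HashMap;

/// Sketch: per-slot motion only records state; recognition and
/// animation feeding happen once per hardware frame.
#[derive(Default)]
struct TouchState {
    positions: HashMap<u32, (f64, f64)>,
    pending_delta: (f64, f64),
}

impl TouchState {
    /// Cheap path: update the slot position and accumulate the delta.
    /// No recognition, no animation feeding, no redraw here.
    fn on_touch_motion(&mut self, slot: u32, x: f64, y: f64) {
        if let Some(prev) = self.positions.insert(slot, (x, y)) {
            self.pending_delta.0 += x - prev.0;
            self.pending_delta.1 += y - prev.1;
        }
    }

    /// Runs once per scan: spread/rotation and animation feeding now
    /// see complete geometry. Returns and resets the accumulated delta.
    fn on_touch_frame(&mut self) -> (f64, f64) {
        std::mem::take(&mut self.pending_delta)
    }
}
```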
…TouchpadTapHoldDrag)

Add two new touchpad gesture triggers using libinput's hold gesture API:

- TouchpadTapHold: fires on release after a stationary hold (3+
  fingers). Fast taps pass through to clients (e.g. terminal 3-finger
  paste).
- TouchpadTapHoldDrag: fires when held fingers start moving, reusing
  the existing swipe bind infrastructure for continuous animations.
  Distinguished from direct swipes by the hold-before-move pause.

Also fixes the pre-existing tests::parse snapshot failure (touch →
touchscreen rename, missing fields on Bind and WindowRule structs).

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
Add TouchTapHoldDrag trigger for touchscreen — fires when fingers hold
stationary (tap candidate alive) and then start moving past the wobble
threshold.

Distinguished from direct swipes via tap-hold-trigger-delay-ms (default
150ms): fast swipes bypass hold-drag and enter normal recognition.
Supports an optional direction= property for per-direction binds after
the hold (directional checked first, omnidirectional fallback). Can
drive continuous actions via the existing ActiveTouchBind::Swipe
infrastructure.

New config knob: tap-hold-trigger-delay-ms in the touchscreen gestures
block.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
- edge-start-distance: 30 → 12 (narrower edge zone feels more natural)
- tap-timeout-ms: 250 → 500 (allows slightly more deliberate taps)
- tap-hold-trigger-delay-ms: 150 → 200 (reduces fast-swipe false
  positives)

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
Add the Gesture IPC Refactor RFC to the wiki — design plan for replacing
the tag system with env-var spawn context, stdin progress pipes, a public
IPC event stream, noop=consume semantics, and per-window binds {} in
window-rules for fingers=1/2 disambiguation. Sourced from PR niri-wm#3771 review
discussion (Atan-D-RP4 originated the core ideas).
Also convert the Status banner on Design:-Touchscreen-Gestures.md from a
bold-prefixed quote to a [!IMPORTANT] admonition, per the docs policy in
Development:-Documenting-niri.md.
Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
Per Development:-Developing-niri.md, debug! is for normal-operation
messages whose absence shouldn't hurt UX, while trace! is for spammy /
performance-intensive diagnostics — and trace! is compiled out of
release builds.

TOUCH-DBG FRAME fires on every motion event during gesture recognition
(multiple times per frame, every gesture). That's textbook trace! —
useful when actively tuning the recognizer, but not something a release
build should pay for.

The other TOUCH-DBG tags (FINGER-LAND, UNLOCK, LOCK, TAP) stay at
debug! since they fire at gesture lifecycle events, which are rare and
fit the "normal operation" definition.

To see FRAME logs in dev builds, use RUST_LOG=niri=trace.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
…lots
Multi-finger gesture onset forwards wl_touch.down for the first two
fingers before the 3rd finger flips the gesture gate. The matching up
events are then suppressed by that same gate — leaving clients with
two phantom touch slots, observed as GTK/Qt dialog buttons ignoring
single-finger taps until a two-finger tap happens to reuse those slot
IDs and drain the sequences.
Track forwarded slots in `Niri::touch_forwarded_slots` and at the
transition into gesture tracking emit explicit handle.up for each
previously-forwarded slot plus handle.cancel + handle.frame so the
client can't hold them as phantoms. Clear on all-fingers-up and on
TouchCancel.
The existing handle.cancel at the swipe/pinch/rotate LOCK path still
covers the case where LOCK fires during motion without a new finger
DOWN, but was insufficient for the common 3rd-finger transition path.
Also ships compositor-side TOUCH-DBG FORWARD / BLOCKED / CANCEL-CLIENT
tracing that was essential to diagnose this — the prior assumption
that touch_gesture_locked was sticky turned out wrong, but the logs
immediately revealed the real "first two forwarded, rest blocked"
pattern.
Addresses reviewer report that foot --server scroll/select breaks
after a touch gesture ("self-healed and then re-broke") — same root
cause.
Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
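The slot bookkeeping described above can be sketched like this (the type and method names are assumptions, not niri's actual code): remember which slots were forwarded to the client, and synthesize a release for each of them when the gesture gate flips:

```rust
use std::collections::HashSet;

/// Sketch: tracks which touch slots had wl_touch.down forwarded to the
/// client before the gesture gate engaged.
#[derive(Default)]
struct ForwardedSlots(HashSet<i32>);

impl ForwardedSlots {
    /// Record a slot whose down event reached the client.
    fn on_forward_down(&mut self, slot: i32) {
        self.0.insert(slot);
    }

    /// At the transition into gesture tracking: return every
    /// previously-forwarded slot so the caller can emit handle.up for
    /// each (plus handle.cancel + handle.frame), preventing phantom
    /// slots. Clears the set, as all-fingers-up / TouchCancel do.
    fn on_gesture_lock(&mut self) -> Vec<i32> {
        let mut slots: Vec<i32> = self.0.drain().collect();
        slots.sort_unstable();
        slots
    }
}
```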
Pure formatting changes (whitespace, line wrapping). No logic changes.
Resolves the rustfmt CI failure.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
Adds a `TouchpadPinch { fingers, direction }` trigger alongside the
existing touchpad gestures. libinput emits pinch events natively for
2/3/4 fingers on touchpads, so the parser accepts fingers=2 here (vs
3+ for touchscreen which reserves 2-finger for client passthrough);
MIN_FINGERS gating is now per-family.
Classifier hooks into on_gesture_pinch_begin/update/end and fires the
bind once per gesture when `|scale - 1.0|` crosses a configurable
threshold. Scale-ratio units (not pixels) since libinput pre-normalizes
touchpad pinch data — new pinch-trigger-scale knob lives under
input.touchpad.gestures alongside swipe-trigger-distance, default 0.15.
Raw pinch events still forward to Wayland clients so app-side
pinch-to-zoom keeps working in parallel with the compositor bind.
Tagged binds emit discrete IPC GestureBegin/End events matching the
TouchpadTapHold pattern.
KDL syntax: `TouchpadPinch fingers=2 direction="in"`.
Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
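The once-per-gesture scale trigger described above can be sketched as follows (names are assumptions, not niri's actual code): libinput reports a running scale ratio, and the bind fires the first time `|scale - 1.0|` crosses the configured threshold:

```rust
/// Sketch: discrete pinch trigger in scale-ratio units, firing at most
/// once per gesture.
struct PinchTrigger {
    threshold: f64, // pinch-trigger-scale, default 0.15
    fired: bool,
}

impl PinchTrigger {
    fn new(threshold: f64) -> Self {
        Self { threshold, fired: false }
    }

    /// Called from the pinch-update handler with libinput's cumulative
    /// scale. Returns the direction on the crossing frame, else None.
    fn on_update(&mut self, scale: f64) -> Option<&'static str> {
        if self.fired || (scale - 1.0).abs() < self.threshold {
            return None;
        }
        self.fired = true;
        Some(if scale > 1.0 { "out" } else { "in" })
    }
}
```

Reset (constructing a fresh trigger) would happen in the pinch-begin handler, so each gesture gets exactly one firing opportunity.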
This is beyond amazing! Will this allow me to change the gesture for scrolling desktops from 3 fingers to 4 fingers? :)
Yes, it's designed to work with both trackpads and touchscreens, binding N fingers to any action.
Have you decided on a direction for replacing/removing the tags stuff? And for reconciling how to decide whether events should be forwarded to clients or not?
One more question: touchpad config is currently pretty limited. For example, I wanted a slower scroll, but with more scroll acceleration when moving my fingers fast, and inertia scrolling that stops more aggressively (to match macOS-like behavior). That's currently impossible. Can such control be added, or is it planned? If so, I'd be in heaven :)
I have not decided; I've really just been enjoying the Touch* features. This question is bigger than me. I implemented your idea as a proof of concept with tag, and it works pretty decently. What the best way is to get gesture actions outside of the compositor for third-party apps to consume is an important question, and I'm not sure how to go about it. I was hoping the other touchscreen and touchpad features would draw people in and get some ideas flowing.
I'll probably provide some feedback soon.
I haven't used a Mac in 10+ years, so I don't really understand the request. I'll try to get my hands on one and test.
I think this is something that should be added on the libinput side. There's been a similar discussion about it in this thread, #3960. I wonder if this can be achieved via libinput plugins.
I did not realise how federated Linux input is; that would explain the disconnected scrolling feel in apps.

Linux input is a federated pipeline with many ownership boundaries between fingertip and pixel. Each boundary owner makes locally reasonable decisions; the cumulative result is the segmented feel users notice when scrolling across apps. macOS centralizes the whole pipeline: AppKit owns scroll simulation, so every app inherits one consistent feel. Apple isn't better at scroll; Apple owns the boundary.

On Linux, post-lift kinetic ("inertia") scrolling lives in the app, not the compositor and not libinput. You'll notice this everywhere: Firefox coasts, Thunar coasts, xfce4-terminal doesn't coast at all, and where coasting exists the curve differs per app. Each app implements its own kinetics on top of the raw axis events the compositor forwards.

This means a compositor-side change to the active-phase stream (a velocity curve, a scroll multiplier) propagates uniformly to every app, since they all just apply deltaY linearly. But a compositor-side post-lift kinetic engine would double-stack with apps that already simulate their own, producing the wrong feel in Firefox / Thunar / etc. I don't think you will ever get a macOS scroll feel; it seems impossible.
original post
Builds on #3180 — adds configurable settings for touch gestures.
Changes
- `finger-count`, `sensitivity`, and `off` toggle (touchpad & touchscreen)
- `recognition-threshold` for gesture direction locking
- `natural-scroll` (touchpad inherits from device-level libinput)
- `TouchpadSwipe{3,4,5}{Up,Down,Left,Right}` — modifier + swipe fires discrete actions, bare swipes remain continuous
- `gestures` block
- `src/input/touch_gesture.rs`

Addresses #3180 and #372 — sensitivity too high, hardcoded gestures, no way to disable or bind swipe gestures.
Settings UI available separately: niri-touch-settings-UI
Depends on: #3180
Feature Rich Demo

https://youtu.be/eQJSEfhol9c
Configurable touch gestures
Replaces the old hardcoded gesture set with a fully configurable touch
subsystem for both touchpad and touchscreen.
What's implemented
- `TouchSwipe`, `TouchPinch`, `TouchRotate`, `TouchEdge`, and `TouchpadSwipe` as KDL nodes with properties, e.g. `TouchSwipe fingers=3 direction="up"` (`TouchpadSwipe` is swipe for touchpad).
- Edge binds can target a zone, e.g. `edge="right" zone="top"`, independently of the rest of the edge.
- A configurable dominance ratio, so rotate can win against swipe/pinch cleanly.
- Binds can carry `tag="..."`, and niri publishes begin/progress/end events for that tag. External clients subscribe without ever touching raw input (see companion repos below).
What users can change
Per bind:
- direction (`up`/`down`/`left`/`right`, `in`/`out`, `cw`/`ccw`)
- edge and zone (`TouchEdge`)
- any action, or `noop` with a `tag` for pure IPC
- `sensitivity`, `natural-scroll`, `tag`

Global tuning (in `input { touchscreen { gestures { … } } }` and the touchpad equivalent):
- dominance ratios (how much one gesture must beat the others to win)
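As a sketch, a bind block using these nodes might look like the following. The `TouchSwipe`/`TouchEdge`/`TouchPinch` node shapes come from this PR; the specific actions and the drawer client name are illustrative assumptions:

```kdl
binds {
    // Continuous swipe bound to a compositor action.
    TouchSwipe fingers=3 direction="up" { toggle-overview; }

    // Edge swipe limited to one zone of the right edge.
    TouchEdge edge="right" zone="top" { spawn "some-drawer-client"; }

    // Pure IPC bind: noop action, but tagged so external clients
    // receive begin/progress/end events for this gesture.
    TouchPinch fingers=4 direction="in" tag="zoom" { noop; }
}
```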
Companion tools
- GTK4 preferences app for every knob and bind above.
- Generic drawer/progress-bar client driven by `tag=` IPC.
- Live debug window for recognition state (debug builds only).